Phys. Rev. E: Convergence of stochastic learning in perceptrons with binary synapses
Authors: Walter Senn and Stefano Fusi
Abstract
The efficacy of a biological synapse is naturally bounded, and at some resolution, at the latest at the level of single vesicles, it is discrete. The finite number of synaptic states dramatically reduces the storage capacity of a network when online learning is considered (i.e., the synapses are immediately modified by each pattern): the trace of old memories decays exponentially with the number of new memories (palimpsest property). Moreover, finding the discrete synaptic strengths that enable the classification of linearly separable patterns is a combinatorially hard problem known to be NP-complete. Here we show that learning with discrete (binary) synapses is nevertheless possible with high probability if a randomly selected fraction of the synapses is modified following each stimulus presentation (slow stochastic learning). As an additional constraint, the synapses are only changed if the output neuron does not give the desired response, as in classical perceptron learning. We prove that for linearly separable classes of patterns the stochastic learning algorithm converges, with arbitrarily high probability, in a finite number of presentations, provided that the number of neurons encoding the patterns is large enough. The stochastic learning algorithm is also successfully applied to a standard classification problem of non-linearly separable patterns by using multiple, stochastically independent output units, achieving a performance comparable to the best reported for this task.
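As a concrete reading of the rule sketched in the abstract, here is a minimal Python sketch of slow stochastic perceptron learning with binary synapses, assuming binary inputs and weights in {0, 1} and a simple threshold output unit; the parameter names (q, theta, n_epochs) and the threshold choice are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_stochastic_binary_perceptron(X, y, q=0.05, theta=None, n_epochs=500):
    """Slow stochastic perceptron learning with binary synapses.

    X : (P, N) array of binary input patterns
    y : (P,) array of desired outputs in {0, 1}
    q : probability that an eligible synapse is modified on an error
        (the 'randomly selected fraction' of changed synapses)
    """
    P, N = X.shape
    w = rng.integers(0, 2, size=N)            # binary synapses in {0, 1}
    if theta is None:
        theta = 0.25 * N                      # illustrative firing threshold
    for _ in range(n_epochs):
        errors = 0
        for mu in rng.permutation(P):
            out = int(X[mu] @ w > theta)
            if out == y[mu]:
                continue                      # correct response: change nothing
            errors += 1
            # Only synapses with active presynaptic input are eligible, and
            # each is switched with probability q in the corrective direction.
            eligible = (X[mu] == 1) & (rng.random(N) < q)
            w[eligible] = y[mu]               # potentiate toward 1, depress toward 0
        if errors == 0:                       # all patterns classified correctly
            break
    return w
```

On linearly separable patterns and for large enough N, this scheme converges with high probability, in line with the result stated above; the small per-synapse modification probability q is what makes the learning "slow".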
Similar articles
Origin of the computational hardness for learning with binary synapses
Through supervised learning in a binary perceptron one is able to classify an extensive number of random patterns by a proper assignment of binary synaptic weights. However, finding such assignments in practice is a highly nontrivial task. The relation between the weight-space structure and the algorithmic hardness has not yet been fully understood. To this end, we analytically derive the Franz-...
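To make the "nontrivial task" concrete: a naive search enumerates all 2^N binary weight assignments, which is feasible only for very small N. A sketch under assumed conventions (threshold unit, weights in {0, 1}):

```python
import itertools
import numpy as np

def exhaustive_binary_search(X, y, theta):
    """Brute-force search for binary weights classifying all patterns.

    The search space contains 2**N candidate weight vectors, so the run time
    grows exponentially with the input dimension N, which is the combinatorial
    hardness that motivates stochastic learning rules.
    """
    N = X.shape[1]
    for bits in itertools.product((0, 1), repeat=N):
        w = np.array(bits)
        if np.array_equal((X @ w > theta).astype(int), y):
            return w
    return None  # no consistent binary weight assignment exists
```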
Recurrent network of perceptrons with three state synapses achieves competitive classification on real inputs
We describe an attractor network of binary perceptrons receiving inputs from a retinotopic visual feature layer. Each class is represented by a random subpopulation of the attractor layer, which is turned on in a supervised manner during learning of the feed-forward connections. These connections are discrete three-state synapses, updated according to a simple field-dependent Hebbian rule. For testing, ...
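The snippet does not spell out the plasticity rule, so the following is only a schematic of what a field-dependent Hebbian step on three-state synapses (values in {-1, 0, +1}) could look like: active synapses move one state toward potentiation or depression, gated by whether the postsynaptic field has already crossed a stop-learning threshold. The thresholds and gating logic are assumptions, not the paper's rule.

```python
import numpy as np

rng = np.random.default_rng(1)

def field_dependent_hebbian_step(w, x, target, theta_stop=1.0, q=0.1):
    """One schematic field-dependent Hebbian update on three-state synapses.

    w      : (N,) synapses with values in {-1, 0, +1}
    x      : (N,) binary presynaptic activities
    target : desired postsynaptic activity (0 or 1)
    """
    h = x @ w                                     # postsynaptic field
    if target == 1 and h < theta_stop:
        step = +1                                 # potentiate active synapses
    elif target == 0 and h > -theta_stop:
        step = -1                                 # depress active synapses
    else:
        return w                                  # field adequate: stop learning
    chosen = (x == 1) & (rng.random(w.size) < q)  # stochastic selection
    w = w.copy()
    w[chosen] = np.clip(w[chosen] + step, -1, 1)  # stay within the three states
    return w
```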
Optimal Properties of Analog Perceptrons with Excitatory Weights
The cerebellum is a brain structure traditionally associated with supervised learning. According to this theory, plasticity at the parallel fiber (PF) to Purkinje cell (PC) synapses is guided by the climbing fibers (CF), which encode an 'error signal'. Purkinje cells have thus been modeled as perceptrons learning binary input/output associations. At maximal capacity, a perceptron wit...
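One concrete way to model a perceptron restricted to excitatory weights, as in the PF-to-PC setting described here, is to run standard perceptron updates and clip the weights at zero after each step; the learning rate and threshold below are illustrative, not the paper's.

```python
import numpy as np

def train_excitatory_perceptron(X, y, lr=0.1, theta=1.0, n_epochs=200):
    """Perceptron learning under the sign constraint w >= 0 (excitatory only).

    With no inhibitory (negative) weights available, misclassified '0' targets
    can only be handled by shrinking weights toward zero and by the threshold.
    """
    P, N = X.shape
    w = np.zeros(N)
    for _ in range(n_epochs):
        mistakes = 0
        for mu in range(P):
            out = int(X[mu] @ w > theta)
            if out != y[mu]:
                mistakes += 1
                w += lr * (y[mu] - out) * X[mu]   # standard perceptron step
                np.clip(w, 0.0, None, out=w)      # enforce excitatory weights
        if mistakes == 0:
            break
    return w
```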
Stochastic learning in a neural network with adapting synapses
We consider a neural network with adapting synapses whose dynamics can be analytically computed. The model is made of N neurons, each of which is connected to K input neurons chosen at random in the network. The synapses are n-state variables which evolve in time according to stochastic learning rules; a parallel stochastic dynamics is assumed for the neurons. Since the network maintains the sam...
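A schematic of the architecture just described, assuming N neurons, K random inputs per neuron, and synapses taking one of n discrete states; because the snippet is truncated, the sigmoidal firing probability used below is an assumption rather than the paper's dynamics.

```python
import numpy as np

rng = np.random.default_rng(2)

def build_network(N=100, K=10, n_states=4):
    """N neurons, each receiving input from K randomly chosen neurons through
    synapses that take one of n_states discrete values."""
    inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
    J = rng.integers(0, n_states, size=(N, K))     # discrete synaptic states
    return inputs, J

def parallel_step(s, inputs, J, beta=2.0, theta=None):
    """One parallel stochastic update: every neuron fires with a probability
    that grows sigmoidally with its local field."""
    h = np.einsum('ik,ik->i', J, s[inputs])        # local field from K inputs
    if theta is None:
        theta = h.mean()                           # illustrative threshold
    p_fire = 1.0 / (1.0 + np.exp(-beta * (h - theta)))
    return (rng.random(s.size) < p_fire).astype(int)
```

A simulation would alternate parallel_step updates of the neurons with a learning rule that moves the discrete synaptic states; that rule is omitted here since the snippet is cut off.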
Journal: Physical Review E
Volume/Issue: -
Pages: -
Publication date: 2005